Kumar Kshitij Patel
Research Fellow, Postdoctoral Associate (on leave until May), E-mail: kkpatel@ttic.edu
I am a postdoctoral associate at Yale FDS, currently on leave to be a Simons Research Fellow for the Spring semester program on Federated and Collaborative Learning. Before joining Yale, I was a PhD student at the Toyota Technological Institute at Chicago (TTIC), where I had the privilege of being advised by Prof. Nati Srebro and Prof. Lingxiao Wang. Throughout my research career, I have explored various facets of collaborative learning, focusing on proving theoretical guarantees for optimization and ensuring the privacy of distributed algorithms amid data and systems heterogeneity. Recently, I have been interested in examining the incentives that encourage agents to initiate and sustain these collaborations (see our recent workshop).
For a (mostly) up-to-date list of my publications, please visit my Google Scholar profile. You can also access my CV here.
During Summer 2023, I worked as a research intern with Nidham Gazagnadou and Lingjuan Lyu on the Privacy Preserving Machine Learning team at Sony AI in Tokyo, Japan. During Summer 2020, I worked with the amazing CodeGuru team at Amazon Web Services as an applied scientist intern. Before joining TTIC, I obtained my BTech in Computer Science and Engineering at the Indian Institute of Technology Kanpur, where I was fortunate to work with Prof. Purushottam Kar on bandit learning algorithms. I also spent a year of my undergraduate studies on an academic exchange at École Polytechnique Fédérale de Lausanne (EPFL), where I worked with Prof. Martin Jaggi at the Machine Learning and Optimization Laboratory (MLO).
Really excited to be co-organizing a Simons workshop on data heterogeneity with some wonderful people (people close to me know that I spend more time than I would like to admit thinking about data heterogeneity assumptions :p).
Our paper on federated training of diffusion models with personalization and local differential privacy was accepted at CVPR'26. An updated version is coming to arXiv soon (here is an older version).
Our paper on group DRO linear regression was accepted at ICLR'26. An updated version is coming to arXiv soon (here is an older workshop version).
Excited to move to Berkeley for the semester-long research program on Federated and Collaborative Learning at the Simons Institute.
Heading to San Diego to present our new paper on Local SGD at NeurIPS'25.
Really excited to be co-organizing a TTIC summer workshop with some wonderful people.
I have graduated from TTIC and moved to Yale FDS as a postdoctoral associate. My thesis, titled "What Makes Local Updates Effective: The Role of Data Heterogeneity and Smoothness," can be found here.
I co-organized a workshop on Learning from Heterogeneous Sources as part of the Simons Spring semester program on Federated and Collaborative Learning.
I co-organized a workshop on Incentives for Collaborative Learning and Data Sharing at TTIC this summer.
I co-organized a workshop on Theoretical Advances in Federated Learning last summer (2023) at TTIC.
I co-taught a tutorial at UAI'23 titled Online Optimization meets Federated Learning.
I have served or am serving as a reviewer for STOC'21, TMLR, JMLR, ICML'21/'22/'24, NeurIPS'21/'22/'23/'24, ICLR'22/'23/'24, AISTATS'22/'23, and the Springer Machine Learning Journal; as a session chair for ICML'22 and NeurIPS'22; and as a volunteer for IJCAI'24, ICML'20, and ICLR'20. I received top reviewer awards at ICLR'22, ICML'22, and NeurIPS'22.
I am participating in the NSF-Simons Research Collaboration on the Mathematics of Deep Learning (MoDL).
I co-organized the TTIC Student Workshop 2021 with Gene Li. We also organized a TTIC/UChicago student theory seminar in Spring 2021. If you'd like to take over and restart this series, please let me know.
I was a Teaching Assistant for the Convex Optimization course at TTIC during Winter '22 and Winter '24, and a co-organizer of the Research at TTIC Colloquium for Fall-Winter 2021.
I participated in the Machine Learning Summer School at Tübingen, Germany during summer 2020.